Automatic Differentiation and the Step Computation in the Limited Memory BFGS Method

Authors

  • Jean Charles Gilbert
  • J. Nocedal
Abstract

It is shown that the two-loop recursion for computing the search direction of a limited memory method for optimization can be derived by means of the reverse mode of automatic differentiation applied to an auxiliary function.
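For context, the recursion in question is the standard L-BFGS two-loop recursion for applying the implicit inverse-Hessian approximation to the current gradient. Below is a minimal NumPy sketch of that recursion; the function name, the storage of the pairs as Python lists, and the choice of initial scaling gamma are illustrative assumptions, not details taken from the paper, whose contribution is showing that this recursion can be derived by reverse-mode automatic differentiation of an auxiliary function.

```python
import numpy as np

def two_loop_recursion(grad, s_list, y_list):
    """Return H_k @ grad, where H_k is the L-BFGS inverse-Hessian
    approximation defined by the stored pairs (s_i, y_i), ordered oldest
    to newest.  The search direction is the negative of the result."""
    if not s_list:                  # no curvature pairs yet: plain gradient
        return grad.copy()
    rho = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alpha = [0.0] * len(s_list)
    q = grad.copy()
    # First loop: run from the newest pair back to the oldest.
    for i in reversed(range(len(s_list))):
        alpha[i] = rho[i] * np.dot(s_list[i], q)
        q -= alpha[i] * y_list[i]
    # Initial scaling gamma = s^T y / y^T y from the most recent pair
    # (a common choice of initial matrix, assumed here for concreteness).
    gamma = np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
    r = gamma * q
    # Second loop: run from the oldest pair forward to the newest.
    for i in range(len(s_list)):
        beta = rho[i] * np.dot(y_list[i], r)
        r += (alpha[i] - beta) * s_list[i]
    return r
```

In use, one would store s_i = x_{i+1} - x_i and y_i = ∇f(x_{i+1}) - ∇f(x_i) for the most recent m iterations and take the step along -two_loop_recursion(∇f(x_k), s_list, y_list).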


Related articles


Towards a Discrete Newton Method with Memory for Large-scale Optimization

A new method for solving large nonlinear optimization problems is outlined. It attempts to combine the best properties of the discrete-truncated Newton method and the limited memory BFGS method, to produce an algorithm that is both economical and capable of handling ill-conditioned problems. The key idea is to use the curvature information generated during the computation of the discrete Newton...

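The entry above is truncated, but "discrete Newton" conventionally refers to approximating Hessian-vector products by finite differences of the gradient, and each such product is itself a curvature pair that a limited memory matrix could reuse. A minimal sketch under that reading (the function name, the step size eps, and the quadratic test problem are illustrative assumptions, not taken from the abstract):

```python
import numpy as np

def fd_hessian_vector_product(grad_fn, x, v, eps=1e-6):
    """Approximate H(x) @ v by a forward difference of the gradient:
    (grad(x + eps*v) - grad(x)) / eps.  The pair (eps*v, grad(x + eps*v) - grad(x))
    is curvature information of the kind a limited memory BFGS matrix stores."""
    return (grad_fn(x + eps * v) - grad_fn(x)) / eps

# Quadratic test problem f(x) = 0.5 * x.T @ A @ x, whose gradient is A @ x.
A = np.diag([1.0, 10.0, 100.0])
grad_fn = lambda x: A @ x
x = np.ones(3)
v = np.array([1.0, 0.0, 0.0])
print(fd_hessian_vector_product(grad_fn, x, v))   # close to A @ v = [1, 0, 0]
```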



Full waveform inversion of crosswell seismic data using automatic differentiation

Full waveform inversion (FWI) is an effective and efficient data-fitting technique that has been widely used to produce accurate estimates of model parameters in geophysics. The efficiency and accuracy of FWI are determined by three main components: the numerical solution of the forward problem, the gradient calculation, and the model update, which usually involves an optimization method. The success of t...

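The gradient-calculation component mentioned in the abstract above is where automatic differentiation enters. Here is a toy sketch of the three components using the JAX library's reverse-mode AD; the linear forward operator, the least-squares misfit, and the plain gradient-descent update are stand-ins chosen for brevity, not the wave-equation solver or optimizer used in the cited work.

```python
import jax
import jax.numpy as jnp

def forward(model, operator):
    """Forward problem: predict data from the model parameters (toy linear model)."""
    return operator @ model

def misfit(model, operator, observed):
    """Data-fitting objective: least-squares residual between predicted and observed data."""
    residual = forward(model, operator) - observed
    return 0.5 * jnp.sum(residual ** 2)

# Gradient calculation by reverse-mode automatic differentiation.
grad_misfit = jax.grad(misfit)

# Synthetic problem: a random 20x5 operator and data generated from a known model.
key = jax.random.PRNGKey(0)
operator = jax.random.normal(key, (20, 5))
observed = forward(jnp.arange(5.0), operator)

# Model update: plain gradient descent (a real FWI code would typically hand
# the gradient to an optimizer such as L-BFGS instead).
model = jnp.zeros(5)
for _ in range(200):
    model = model - 1e-2 * grad_misfit(model, operator, observed)
```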


Journal:

Volume:   Issue:

Pages:  -

Publication date: 2001